Learning Representations of Wordforms With Recurrent Networks: Comment on Sibley, Kello, Plaut, & Elman (2008)


Related Articles

Learning Representations of Wordforms With Recurrent Networks: Comment on Sibley, Kello, Plaut, & Elman (2008)

Sibley et al. (2008) report a recurrent neural network model designed to learn wordform representations suitable for written and spoken word identification. The authors claim that their sequence encoder network overcomes a key limitation associated with models that code letters by position (e.g., CAT might be coded as C-in-position-1, A-in-position-2, T-in-position-3). The problem with coding l...
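A minimal sketch of the letter-by-position coding scheme being criticized; the alphabet, the fixed word-length limit, and the function name are illustrative assumptions, not details from the paper. It shows how a one-letter shift in spelling leaves two related wordforms with no overlapping features.

```python
# Illustrative sketch of slot-based ("letter-by-position") coding; the fixed
# MAX_LEN and 26-letter alphabet are assumptions for the example only.
import numpy as np

ALPHABET = "abcdefghijklmnopqrstuvwxyz"
MAX_LEN = 6  # assumed maximum word length

def slot_code(word: str) -> np.ndarray:
    """One-hot code each letter in its absolute position (C-in-position-1, ...)."""
    vec = np.zeros((MAX_LEN, len(ALPHABET)))
    for pos, ch in enumerate(word.lower()[:MAX_LEN]):
        vec[pos, ALPHABET.index(ch)] = 1.0
    return vec.ravel()

# The limitation at issue: a one-letter shift destroys all featural overlap.
cat, scat = slot_code("cat"), slot_code("scat")
print(float(cat @ scat))  # 0.0 -- "cat" and "scat" share no active units
```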


Learning Representations of Wordforms

The forms of words as they appear in text and speech are central to theories and models of lexical processing, yet current means of representing wordforms are lacking in certain key aspects. In the present study, a connectionist model termed the wordformer is presented that learns wordform representations through exposure to strings of stress-marked phonemes or letters. A small-scale simulation...
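A minimal sketch of the general idea described here, not the authors' architecture: a recurrent network reads a letter string one symbol at a time, and its final hidden state serves as a fixed-length wordform representation regardless of word length. The weights, layer sizes, and names are illustrative assumptions.

```python
# Illustrative Elman-style recurrence; untrained random weights, assumed sizes.
import numpy as np

rng = np.random.default_rng(0)
ALPHABET = "abcdefghijklmnopqrstuvwxyz"
HIDDEN = 32  # assumed hidden-layer size

W_in = rng.normal(scale=0.1, size=(HIDDEN, len(ALPHABET)))
W_rec = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))

def encode(word: str) -> np.ndarray:
    """Fold a letter string of any length into one fixed-length vector."""
    h = np.zeros(HIDDEN)
    for ch in word.lower():
        x = np.zeros(len(ALPHABET))
        x[ALPHABET.index(ch)] = 1.0
        h = np.tanh(W_in @ x + W_rec @ h)  # hidden state carries position context
    return h

print(encode("cat").shape, encode("caterpillar").shape)  # both (32,)
```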


Sequence Learning with Recurrent Networks: Analysis of Internal Representations

The recognition and learning of temporal sequences is fundamental to cognitive processing. Several recurrent networks attempt to encode past history through feedback connections from "context units". However, the internal representations formed by these networks are not well understood. In this paper, we use cluster analysis to interpret the hidden unit encodings formed when a network with conte...
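A minimal sketch of the analysis step described here, clustering the hidden-unit activation vectors; the activations, labels, and sizes below are random stand-ins rather than states recorded from a trained network.

```python
# Illustrative cluster analysis of hidden-unit encodings; the activation matrix
# is stand-in data, not output of the paper's network.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
labels = ["ba", "bi", "ga", "gi"]                   # hypothetical input sequences
hidden_states = rng.normal(size=(len(labels), 16))  # one 16-unit encoding per sequence

tree = linkage(hidden_states, method="average", metric="euclidean")
print(tree)                                         # merge order and distances
print(fcluster(tree, t=2, criterion="maxclust"))    # assign each encoding to 1 of 2 clusters
```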


Learning Performance of Networks like Elman's Simple Recurrent Networks but having Multiple State Vectors

Target Papers:
• William H. Wilson, A comparison of architectural alternatives for recurrent networks, Proceedings of the Fourth Australian Conference on Neural Networks, ACNN’93, Melbourne, 13 February 1993, 189-192. ftp://ftp.cse.unsw.edu.au/pub/users/billw/wilson.recurrent.ps.Z
• William H. Wilson, Stability of learning in classes of recurrent and feedforward networks, in Proceedings of the ...


Learning Bilingual Phrase Representations with Recurrent Neural Networks

We introduce a novel method for bilingual phrase representation with Recurrent Neural Networks (RNNs), which transforms a sequence of word feature vectors into a fixed-length phrase vector across two languages. Our method measures the difference between the vectors of source- and target-side phrases, and can be used to predict the semantic equivalence of source and target word sequences in the ph...
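A minimal sketch of the stated idea: one recurrent encoder per language producing fixed-length phrase vectors, with the distance between them used as an equivalence score. The encoder form, sizes, and the Euclidean scoring rule are assumptions for illustration, not the paper's exact model.

```python
# Illustrative two-encoder setup with untrained random weights and assumed sizes.
import numpy as np

rng = np.random.default_rng(2)
DIM, HIDDEN = 8, 16  # assumed word-feature and phrase-vector dimensions

def make_encoder():
    W_in = rng.normal(scale=0.1, size=(HIDDEN, DIM))
    W_rec = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
    def encode(word_vectors: np.ndarray) -> np.ndarray:
        h = np.zeros(HIDDEN)
        for x in word_vectors:               # one word-feature vector per step
            h = np.tanh(W_in @ x + W_rec @ h)
        return h                             # fixed-length phrase vector
    return encode

encode_src, encode_tgt = make_encoder(), make_encoder()
src = rng.normal(size=(3, DIM))              # a 3-word source phrase (stand-in features)
tgt = rng.normal(size=(4, DIM))              # a 4-word target phrase

# Smaller distance -> stronger predicted semantic equivalence.
print(np.linalg.norm(encode_src(src) - encode_tgt(tgt)))
```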



Journal

Journal title: Cognitive Science

Year: 2009

ISSN: 0364-0213, 1551-6709

DOI: 10.1111/j.1551-6709.2009.01062.x